SOR-Methods for the Eigenvalue Problem with Large Sparse Matrices

Authors

  • Axel Ruhe
Abstract

The eigenvalue problem Ax = λBx, where A and B are large and sparse symmetric matrices, is considered. An iterative algorithm for computing the smallest eigenvalue and its corresponding eigenvector, based on the successive overrelaxation (SOR) splitting of the matrices, is developed, and its global convergence is proved. An expression for the optimal overrelaxation factor is found in the case where A and B are two-cyclic (property A). Further, it is shown that this SOR algorithm is the first-order approximation to the coordinate relaxation algorithm, which implies that the same overrelaxation can be applied to the latter algorithm. Several numerical tests are reported. It is found that the SOR method is more effective than coordinate relaxation. If the separation of the eigenvalues is not too bad, the SOR algorithm has a fast rate of convergence, while, for problems with more severe clustering, the conjugate-gradient (c-g) or Lanczos algorithms should be preferred.
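As a rough illustration of the iteration described above, here is a minimal Python sketch of an overrelaxed coordinate sweep on the Rayleigh quotient of the pencil (A, B); the function name, starting vector, overrelaxation factor, stopping rule, and test matrices are assumptions made for this example, not details taken from the paper.

import numpy as np
from scipy.linalg import eigh

def sor_smallest_eigenpair(A, B, omega=1.5, tol=1e-10, max_sweeps=2000):
    """First-order (SOR-type) coordinate sweeps on the Rayleigh quotient
    rho(x) = (x' A x) / (x' B x); each coordinate update is
        x_i <- x_i - omega * r_i / (A_ii - rho * B_ii),
    where r = (A - rho * B) x is the current residual.  No safeguard is
    taken against a vanishing denominator in this sketch."""
    n = A.shape[0]
    x = np.ones(n)                                  # assumed starting vector
    rho = (x @ A @ x) / (x @ B @ x)
    for _ in range(max_sweeps):
        rho_old = rho
        for i in range(n):
            r_i = A[i] @ x - rho * (B[i] @ x)       # i-th residual component
            x[i] -= omega * r_i / (A[i, i] - rho * B[i, i])
            rho = (x @ A @ x) / (x @ B @ x)         # recomputed naively for clarity
        if abs(rho - rho_old) <= tol * max(abs(rho), 1.0):
            break
    return rho, x / np.linalg.norm(x)

# Demo on a 1-D model problem; the tridiagonal matrices are two-cyclic
# (property A), the case for which the abstract mentions an optimal factor.
n = 40
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)            # stiffness-like
B = (4.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)) / 6.0    # mass-like
lam, v = sor_smallest_eigenpair(A, B)
print(lam, eigh(A, B, eigvals_only=True)[0])        # compare with a dense solver

A consequence noted in the abstract is that the same overrelaxation factor can be reused when the first-order step above is replaced by an exact one-dimensional minimization, i.e. by coordinate relaxation.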

Similar articles

On the nonnegative inverse eigenvalue problem of tridiagonal matrices

In this paper, for a given set of real or complex numbers $\sigma$ with nonnegative sum, we first introduce conditions under which there is no nonnegative tridiagonal matrix whose spectrum is $\sigma$. We then present conditions for the existence of such nonnegative tridiagonal matrices.

Preconditioning for solving Hermite Collocation by the Bi-CGSTAB

Explicit pre/post-conditioning of the large, sparse, non-symmetric system of equations arising from the discretization of the Dirichlet Poisson boundary value problem (BVP) by the Hermite collocation method is the problem considered herein. Using the 2-cyclic (red-black) structure of the collocation coefficient matrix, we investigate the eigenvalue distribution of its preconditioned analo...
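As a point of reference for the solver named above, here is a minimal Python sketch of Bi-CGSTAB applied to a generic sparse non-symmetric system with an incomplete-LU preconditioner; the test matrix and the choice of preconditioner are placeholders and are unrelated to the collocation matrix studied in the paper.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 400
A = sp.diags([-1.0, 4.0, -1.2], [-1, 0, 1], shape=(n, n), format="csc")  # stand-in system
b = np.ones(n)

ilu = spla.spilu(A)                                   # incomplete LU factorization
M = spla.LinearOperator(A.shape, matvec=ilu.solve)    # preconditioner as an operator
x, info = spla.bicgstab(A, b, M=M, maxiter=200)
print("converged" if info == 0 else f"info = {info}", np.linalg.norm(A @ x - b))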

Some new restart vectors for explicitly restarted Arnoldi method

The explicitly restarted Arnoldi method (ERAM) can be used to find some eigenvalues of large and sparse matrices. However, it has been shown that even this method may fail to converge. In this paper, we present two new methods to accelerate the convergence of the ERAM algorithm. In these methods, we apply two strategies for the updated initial vector in each restart cycle. The implementation of th...
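For orientation, a minimal Python sketch of the baseline explicitly restarted Arnoldi iteration follows: each cycle restarts with the Ritz vector belonging to the largest-modulus Ritz value. The subspace dimension, tolerance, and test matrix are illustrative assumptions, and the sketch does not implement the two acceleration strategies proposed in the paper.

import numpy as np

def arnoldi(A, v, m):
    """m-step Arnoldi factorization A V_m = V_{m+1} H (breakdown not handled)."""
    n = v.size
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                       # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

def eram(A, m=20, max_restarts=100, tol=1e-8, seed=0):
    """Plain explicit restart: reuse the dominant Ritz vector as the next start."""
    v = np.random.default_rng(seed).standard_normal(A.shape[0])
    for _ in range(max_restarts):
        V, H = arnoldi(A, v, m)
        theta_all, S = np.linalg.eig(H[:m, :])
        k = np.argmax(np.abs(theta_all))             # Ritz value of largest modulus
        theta, y = theta_all[k], V[:, :m] @ S[:, k]  # and its Ritz vector
        if np.linalg.norm(A @ y - theta * y) <= tol * abs(theta):
            break
        v = np.real(y)                               # updated initial vector
    return theta, y

# Demo on a symmetric test matrix whose dominant eigenvalue (near 3) is well
# separated from the rest of the spectrum.
rng = np.random.default_rng(1)
G = rng.standard_normal((300, 300))
A = np.diag(np.concatenate([np.linspace(1.0, 2.0, 299), [3.0]])) + 1e-3 * (G + G.T)
theta, y = eram(A)
print(theta, np.linalg.eigvalsh(A)[-1])              # compare with a dense solver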

A Jacobi–Davidson type method for the product eigenvalue problem

We propose a Jacobi–Davidson type technique to compute selected eigenpairs of the product eigenvalue problem Am · · ·A1x = λx, where the matrices may be large and sparse. To avoid difficulties caused by a high condition number of the product matrix, we split up the action of the product matrix and work with several search spaces. We generalize the Jacobi–Davidson correction equation, and the ha...

An Iterative Method for Computing the Pseudospectral Abscissa for a Class of Nonlinear Eigenvalue Problems

where A1, ..., Am are given n × n matrices and the functions p1, ..., pm are assumed to be entire. This includes not only polynomial eigenvalue problems but also eigenvalue problems arising from systems of delay differential equations. Our aim is to compute the ε-pseudospectral abscissa, i.e. the real part of the rightmost point in the ε-pseudospectrum, which is the complex set obtained...

Journal title:

Volume   Issue

Pages  -

Publication date: 2010